[ Wed Sep 28 02:18:53 2022 ] using warm up, epoch: 5
[ Wed Sep 28 02:19:08 2022 ] Parameters:
{'work_dir': 'work_dir/ntu60/csub/fc_bone', 'model_saved_name': 'work_dir/ntu60/csub/fc_bone/runs', 'config': 'config/nturgbd-cross-subject/fc_bone.yaml', 'phase': 'train', 'save_score': False,
 'joint_label': [], 'seed': 1, 'log_interval': 100, 'save_interval': 1, 'save_epoch': 35, 'eval_interval': 5, 'ema': False, 'print_log': True, 'show_topk': [1, 5],
 'feeder': 'feeders.feeder_ntu.Feeder', 'num_worker': 48,
 'train_feeder_args': {'data_path': 'data/ntu60/NTU60_CS.npz', 'split': 'train', 'debug': False, 'random_choose': False, 'random_shift': False, 'random_move': False, 'window_size': 64, 'normalization': False, 'random_rot': True, 'p_interval': [0.5, 1], 'vel': False, 'bone': True},
 'test_feeder_args': {'data_path': 'data/ntu60/NTU60_CS.npz', 'split': 'test', 'window_size': 64, 'p_interval': [0.95], 'vel': False, 'bone': True, 'debug': False},
 'model': 'model.FC-Chains_L_multi_head_new_12_layers.Model', 'model_args': {'num_class': 60, 'num_point': 25, 'num_person': 2},
 'weights': None, 'ignore_weights': [], 'base_lr': 0.1, 'step': [90, 100], 'device': [1], 'optimizer': 'SGD', 'nesterov': True, 'momentum': 0.9, 'batch_size': 64, 'test_batch_size': 64, 'start_epoch': 0, 'num_epoch': 110, 'weight_decay': 0.0004, 'lr_decay_rate': 0.1, 'warm_up_epoch': 5}
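The optimizer settings above (base_lr 0.1, milestones [90, 100], lr_decay_rate 0.1, warm_up_epoch 5) imply the step-decay schedule with linear warm-up that scripts of this family typically use. A minimal sketch, assuming linear ramp-up over the first five epochs and one multiplicative decay per passed milestone; the actual training script may differ in details:

```python
def learning_rate(epoch, base_lr=0.1, warm_up_epoch=5,
                  step=(90, 100), lr_decay_rate=0.1):
    """Per-epoch LR for the config above (assumed schedule shape).

    Warm-up: LR ramps linearly from base_lr/warm_up_epoch to base_lr.
    After warm-up: base_lr scaled by lr_decay_rate once for every
    milestone in `step` that has been reached.
    """
    if epoch < warm_up_epoch:
        return base_lr * (epoch + 1) / warm_up_epoch
    passed = sum(1 for s in step if epoch >= s)
    return base_lr * (lr_decay_rate ** passed)
```

Under this sketch the LR sits at 0.1 from epoch 5 to 89, drops to 0.01 at epoch 90, and to 0.001 at epoch 100, which matches the "using warm up, epoch: 5" line and the num_epoch 110 setting.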

[ Wed Sep 28 02:19:08 2022 ] # Parameters: 2082097
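Both feeder configs set 'bone': True, so the model is trained on bone vectors rather than raw joint coordinates. A minimal sketch of the usual bone transform (each bone is the coordinate difference between a joint and its skeletal parent); the (child, parent) pairs here are a hypothetical subset, and the real feeder defines all 25 NTU pairs:

```python
# Hypothetical subset of NTU (child, parent) joint pairs, 1-indexed;
# the actual pair list lives in the feeder code.
PAIRS = [(1, 2), (2, 21), (3, 21), (4, 3)]

def joints_to_bones(joints, pairs=PAIRS):
    """joints: {joint_id: (x, y, z)}; returns {child_id: bone_vector},
    where bone_vector = joints[child] - joints[parent] per coordinate."""
    return {
        child: tuple(c - p for c, p in zip(joints[child], joints[parent]))
        for child, parent in pairs
    }
```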
[ Wed Sep 28 02:19:08 2022 ] Training epoch: 1
[ Wed Sep 28 02:22:18 2022 ] 	Mean training loss: 2.7411. loss2: 0.0000. Mean training acc: 27.61%.
[ Wed Sep 28 02:22:18 2022 ] 	Time consumption: [Data]01%, [Network]98%
[ Wed Sep 28 02:22:18 2022 ] Eval epoch: 1
[ Wed Sep 28 02:22:48 2022 ] 	Mean test loss of 258 batches: 1.8339599412541057.
[ Wed Sep 28 02:22:48 2022 ] 	Top1: 46.28%
[ Wed Sep 28 02:22:49 2022 ] 	Top5: 82.48%
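Each eval block reports Top1 and Top5 accuracy ('show_topk': [1, 5] in the config). A minimal plain-Python sketch of top-k accuracy over per-class score vectors, as a stand-in for the tensor implementation in the evaluation loop:

```python
def topk_accuracy(scores, labels, k=1):
    """Fraction of samples whose true label is among the k highest scores.

    scores: list of per-class score lists, one per sample.
    labels: list of ground-truth class indices.
    """
    hits = 0
    for row, label in zip(scores, labels):
        topk = sorted(range(len(row)), key=lambda i: row[i], reverse=True)[:k]
        hits += label in topk
    return hits / len(labels)
```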
[ Wed Sep 28 02:22:49 2022 ] Training epoch: 2
[ Wed Sep 28 02:25:56 2022 ] 	Mean training loss: 1.6140. loss2: 0.0000. Mean training acc: 51.57%.
[ Wed Sep 28 02:25:56 2022 ] 	Time consumption: [Data]01%, [Network]98%
[ Wed Sep 28 02:25:56 2022 ] Eval epoch: 2
[ Wed Sep 28 02:26:26 2022 ] 	Mean test loss of 258 batches: 1.3516786821590838.
[ Wed Sep 28 02:26:26 2022 ] 	Top1: 59.93%
[ Wed Sep 28 02:26:26 2022 ] 	Top5: 90.03%
[ Wed Sep 28 02:26:26 2022 ] Training epoch: 3
[ Wed Sep 28 02:29:34 2022 ] 	Mean training loss: 1.2056. loss2: 0.0000. Mean training acc: 63.19%.
[ Wed Sep 28 02:29:34 2022 ] 	Time consumption: [Data]01%, [Network]98%
[ Wed Sep 28 02:29:34 2022 ] Eval epoch: 3
[ Wed Sep 28 02:30:03 2022 ] 	Mean test loss of 258 batches: 1.0907374574232471.
[ Wed Sep 28 02:30:03 2022 ] 	Top1: 67.27%
[ Wed Sep 28 02:30:03 2022 ] 	Top5: 93.10%
[ Wed Sep 28 02:30:03 2022 ] Training epoch: 4
[ Wed Sep 28 02:33:11 2022 ] 	Mean training loss: 1.0396. loss2: 0.0000. Mean training acc: 68.10%.
[ Wed Sep 28 02:33:11 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:33:11 2022 ] Eval epoch: 4
[ Wed Sep 28 02:33:41 2022 ] 	Mean test loss of 258 batches: 1.034266979079838.
[ Wed Sep 28 02:33:41 2022 ] 	Top1: 68.47%
[ Wed Sep 28 02:33:41 2022 ] 	Top5: 93.16%
[ Wed Sep 28 02:33:41 2022 ] Training epoch: 5
[ Wed Sep 28 02:36:49 2022 ] 	Mean training loss: 0.9470. loss2: 0.0000. Mean training acc: 70.61%.
[ Wed Sep 28 02:36:49 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:36:49 2022 ] Eval epoch: 5
[ Wed Sep 28 02:37:18 2022 ] 	Mean test loss of 258 batches: 1.2350768823494283.
[ Wed Sep 28 02:37:18 2022 ] 	Top1: 64.90%
[ Wed Sep 28 02:37:18 2022 ] 	Top5: 91.84%
[ Wed Sep 28 02:37:18 2022 ] Training epoch: 6
[ Wed Sep 28 02:40:26 2022 ] 	Mean training loss: 0.8444. loss2: 0.0000. Mean training acc: 73.53%.
[ Wed Sep 28 02:40:26 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:40:27 2022 ] Eval epoch: 6
[ Wed Sep 28 02:40:56 2022 ] 	Mean test loss of 258 batches: 0.9500047839427179.
[ Wed Sep 28 02:40:56 2022 ] 	Top1: 71.24%
[ Wed Sep 28 02:40:56 2022 ] 	Top5: 94.50%
[ Wed Sep 28 02:40:56 2022 ] Training epoch: 7
[ Wed Sep 28 02:44:04 2022 ] 	Mean training loss: 0.7727. loss2: 0.0000. Mean training acc: 75.78%.
[ Wed Sep 28 02:44:04 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:44:04 2022 ] Eval epoch: 7
[ Wed Sep 28 02:44:33 2022 ] 	Mean test loss of 258 batches: 0.9115906008916308.
[ Wed Sep 28 02:44:34 2022 ] 	Top1: 72.43%
[ Wed Sep 28 02:44:34 2022 ] 	Top5: 93.59%
[ Wed Sep 28 02:44:34 2022 ] Training epoch: 8
[ Wed Sep 28 02:47:42 2022 ] 	Mean training loss: 0.7223. loss2: 0.0000. Mean training acc: 77.35%.
[ Wed Sep 28 02:47:42 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:47:42 2022 ] Eval epoch: 8
[ Wed Sep 28 02:48:11 2022 ] 	Mean test loss of 258 batches: 0.9191599468621172.
[ Wed Sep 28 02:48:11 2022 ] 	Top1: 72.87%
[ Wed Sep 28 02:48:11 2022 ] 	Top5: 94.07%
[ Wed Sep 28 02:48:11 2022 ] Training epoch: 9
[ Wed Sep 28 02:51:19 2022 ] 	Mean training loss: 0.6836. loss2: 0.0000. Mean training acc: 78.48%.
[ Wed Sep 28 02:51:19 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:51:19 2022 ] Eval epoch: 9
[ Wed Sep 28 02:51:48 2022 ] 	Mean test loss of 258 batches: 0.830725684877514.
[ Wed Sep 28 02:51:48 2022 ] 	Top1: 75.33%
[ Wed Sep 28 02:51:48 2022 ] 	Top5: 95.47%
[ Wed Sep 28 02:51:48 2022 ] Training epoch: 10
[ Wed Sep 28 02:54:57 2022 ] 	Mean training loss: 0.6676. loss2: 0.0000. Mean training acc: 78.73%.
[ Wed Sep 28 02:54:57 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:54:57 2022 ] Eval epoch: 10
[ Wed Sep 28 02:55:26 2022 ] 	Mean test loss of 258 batches: 0.7073885458846425.
[ Wed Sep 28 02:55:26 2022 ] 	Top1: 78.30%
[ Wed Sep 28 02:55:26 2022 ] 	Top5: 96.00%
[ Wed Sep 28 02:55:26 2022 ] Training epoch: 11
[ Wed Sep 28 02:58:35 2022 ] 	Mean training loss: 0.6351. loss2: 0.0000. Mean training acc: 79.91%.
[ Wed Sep 28 02:58:35 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:58:35 2022 ] Eval epoch: 11
[ Wed Sep 28 02:59:04 2022 ] 	Mean test loss of 258 batches: 0.6647566251976545.
[ Wed Sep 28 02:59:04 2022 ] 	Top1: 79.40%
[ Wed Sep 28 02:59:04 2022 ] 	Top5: 96.49%
[ Wed Sep 28 02:59:04 2022 ] Training epoch: 12
[ Wed Sep 28 03:02:12 2022 ] 	Mean training loss: 0.6299. loss2: 0.0000. Mean training acc: 80.21%.
[ Wed Sep 28 03:02:12 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:02:12 2022 ] Eval epoch: 12
[ Wed Sep 28 03:02:42 2022 ] 	Mean test loss of 258 batches: 0.7737169346606084.
[ Wed Sep 28 03:02:42 2022 ] 	Top1: 77.95%
[ Wed Sep 28 03:02:42 2022 ] 	Top5: 95.66%
[ Wed Sep 28 03:02:42 2022 ] Training epoch: 13
[ Wed Sep 28 03:05:50 2022 ] 	Mean training loss: 0.5941. loss2: 0.0000. Mean training acc: 81.25%.
[ Wed Sep 28 03:05:50 2022 ] 	Time consumption: [Data]01%, [Network]98%
[ Wed Sep 28 03:05:50 2022 ] Eval epoch: 13
[ Wed Sep 28 03:06:19 2022 ] 	Mean test loss of 258 batches: 0.7482273113704467.
[ Wed Sep 28 03:06:19 2022 ] 	Top1: 77.07%
[ Wed Sep 28 03:06:19 2022 ] 	Top5: 96.14%
[ Wed Sep 28 03:06:19 2022 ] Training epoch: 14
[ Wed Sep 28 03:09:27 2022 ] 	Mean training loss: 0.5824. loss2: 0.0000. Mean training acc: 81.63%.
[ Wed Sep 28 03:09:27 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:09:27 2022 ] Eval epoch: 14
[ Wed Sep 28 03:09:57 2022 ] 	Mean test loss of 258 batches: 0.6892956895537155.
[ Wed Sep 28 03:09:57 2022 ] 	Top1: 79.18%
[ Wed Sep 28 03:09:57 2022 ] 	Top5: 96.62%
[ Wed Sep 28 03:09:57 2022 ] Training epoch: 15
[ Wed Sep 28 03:13:05 2022 ] 	Mean training loss: 0.5678. loss2: 0.0000. Mean training acc: 81.90%.
[ Wed Sep 28 03:13:05 2022 ] 	Time consumption: [Data]01%, [Network]98%
[ Wed Sep 28 03:13:05 2022 ] Eval epoch: 15
[ Wed Sep 28 03:13:34 2022 ] 	Mean test loss of 258 batches: 0.6823704030162604.
[ Wed Sep 28 03:13:34 2022 ] 	Top1: 79.01%
[ Wed Sep 28 03:13:34 2022 ] 	Top5: 96.66%
[ Wed Sep 28 03:13:34 2022 ] Training epoch: 16
[ Wed Sep 28 03:16:42 2022 ] 	Mean training loss: 0.5575. loss2: 0.0000. Mean training acc: 82.19%.
[ Wed Sep 28 03:16:42 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:16:42 2022 ] Eval epoch: 16
[ Wed Sep 28 03:17:12 2022 ] 	Mean test loss of 258 batches: 0.8036574291628461.
[ Wed Sep 28 03:17:12 2022 ] 	Top1: 76.51%
[ Wed Sep 28 03:17:12 2022 ] 	Top5: 96.02%
[ Wed Sep 28 03:17:12 2022 ] Training epoch: 17
[ Wed Sep 28 03:20:20 2022 ] 	Mean training loss: 0.5420. loss2: 0.0000. Mean training acc: 82.92%.
[ Wed Sep 28 03:20:20 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:20:20 2022 ] Eval epoch: 17
[ Wed Sep 28 03:20:49 2022 ] 	Mean test loss of 258 batches: 0.6875504328645476.
[ Wed Sep 28 03:20:49 2022 ] 	Top1: 79.12%
[ Wed Sep 28 03:20:49 2022 ] 	Top5: 96.28%
[ Wed Sep 28 03:20:49 2022 ] Training epoch: 18
[ Wed Sep 28 03:23:58 2022 ] 	Mean training loss: 0.5425. loss2: 0.0000. Mean training acc: 82.84%.
[ Wed Sep 28 03:23:58 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:23:58 2022 ] Eval epoch: 18
[ Wed Sep 28 03:24:28 2022 ] 	Mean test loss of 258 batches: 0.7428745086225428.
[ Wed Sep 28 03:24:28 2022 ] 	Top1: 78.30%
[ Wed Sep 28 03:24:28 2022 ] 	Top5: 95.86%
[ Wed Sep 28 03:24:28 2022 ] Training epoch: 19
[ Wed Sep 28 03:27:36 2022 ] 	Mean training loss: 0.5321. loss2: 0.0000. Mean training acc: 83.21%.
[ Wed Sep 28 03:27:36 2022 ] 	Time consumption: [Data]01%, [Network]98%
[ Wed Sep 28 03:27:36 2022 ] Eval epoch: 19
[ Wed Sep 28 03:28:05 2022 ] 	Mean test loss of 258 batches: 0.8849176183458447.
[ Wed Sep 28 03:28:05 2022 ] 	Top1: 75.10%
[ Wed Sep 28 03:28:05 2022 ] 	Top5: 95.26%
[ Wed Sep 28 03:28:05 2022 ] Training epoch: 20
[ Wed Sep 28 03:31:14 2022 ] 	Mean training loss: 0.5179. loss2: 0.0000. Mean training acc: 83.60%.
[ Wed Sep 28 03:31:14 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:31:14 2022 ] Eval epoch: 20
[ Wed Sep 28 03:31:43 2022 ] 	Mean test loss of 258 batches: 0.7141655171795409.
[ Wed Sep 28 03:31:43 2022 ] 	Top1: 78.38%
[ Wed Sep 28 03:31:43 2022 ] 	Top5: 96.08%
[ Wed Sep 28 03:31:43 2022 ] Training epoch: 21
[ Wed Sep 28 03:34:52 2022 ] 	Mean training loss: 0.5156. loss2: 0.0000. Mean training acc: 83.80%.
[ Wed Sep 28 03:34:52 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:34:52 2022 ] Eval epoch: 21
[ Wed Sep 28 03:35:21 2022 ] 	Mean test loss of 258 batches: 0.6435070958479431.
[ Wed Sep 28 03:35:21 2022 ] 	Top1: 81.15%
[ Wed Sep 28 03:35:21 2022 ] 	Top5: 96.53%
[ Wed Sep 28 03:35:21 2022 ] Training epoch: 22
[ Wed Sep 28 03:38:30 2022 ] 	Mean training loss: 0.5128. loss2: 0.0000. Mean training acc: 83.73%.
[ Wed Sep 28 03:38:30 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:38:30 2022 ] Eval epoch: 22
[ Wed Sep 28 03:38:59 2022 ] 	Mean test loss of 258 batches: 0.6670986306528712.
[ Wed Sep 28 03:38:59 2022 ] 	Top1: 80.66%
[ Wed Sep 28 03:38:59 2022 ] 	Top5: 96.35%
[ Wed Sep 28 03:38:59 2022 ] Training epoch: 23
[ Wed Sep 28 03:42:07 2022 ] 	Mean training loss: 0.5080. loss2: 0.0000. Mean training acc: 83.95%.
[ Wed Sep 28 03:42:07 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:42:07 2022 ] Eval epoch: 23
[ Wed Sep 28 03:42:37 2022 ] 	Mean test loss of 258 batches: 0.6647879261263582.
[ Wed Sep 28 03:42:37 2022 ] 	Top1: 80.01%
[ Wed Sep 28 03:42:37 2022 ] 	Top5: 96.32%
[ Wed Sep 28 03:42:37 2022 ] Training epoch: 24
[ Wed Sep 28 03:45:45 2022 ] 	Mean training loss: 0.4975. loss2: 0.0000. Mean training acc: 84.39%.
[ Wed Sep 28 03:45:45 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:45:45 2022 ] Eval epoch: 24
[ Wed Sep 28 03:46:15 2022 ] 	Mean test loss of 258 batches: 0.8769528038742006.
[ Wed Sep 28 03:46:15 2022 ] 	Top1: 74.42%
[ Wed Sep 28 03:46:15 2022 ] 	Top5: 94.95%
[ Wed Sep 28 03:46:15 2022 ] Training epoch: 25
[ Wed Sep 28 03:49:23 2022 ] 	Mean training loss: 0.4928. loss2: 0.0000. Mean training acc: 84.42%.
[ Wed Sep 28 03:49:23 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:49:23 2022 ] Eval epoch: 25
[ Wed Sep 28 03:49:52 2022 ] 	Mean test loss of 258 batches: 0.7142238503163175.
[ Wed Sep 28 03:49:52 2022 ] 	Top1: 79.18%
[ Wed Sep 28 03:49:52 2022 ] 	Top5: 96.45%
[ Wed Sep 28 03:49:52 2022 ] Training epoch: 26
[ Wed Sep 28 03:53:01 2022 ] 	Mean training loss: 0.4955. loss2: 0.0000. Mean training acc: 84.35%.
[ Wed Sep 28 03:53:01 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:53:01 2022 ] Eval epoch: 26
[ Wed Sep 28 03:53:30 2022 ] 	Mean test loss of 258 batches: 0.5716066973269448.
[ Wed Sep 28 03:53:30 2022 ] 	Top1: 82.61%
[ Wed Sep 28 03:53:30 2022 ] 	Top5: 97.25%
[ Wed Sep 28 03:53:30 2022 ] Training epoch: 27
[ Wed Sep 28 03:56:39 2022 ] 	Mean training loss: 0.4874. loss2: 0.0000. Mean training acc: 84.46%.
[ Wed Sep 28 03:56:39 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:56:39 2022 ] Eval epoch: 27
[ Wed Sep 28 03:57:08 2022 ] 	Mean test loss of 258 batches: 0.6335456357445828.
[ Wed Sep 28 03:57:08 2022 ] 	Top1: 81.17%
[ Wed Sep 28 03:57:08 2022 ] 	Top5: 96.34%
[ Wed Sep 28 03:57:08 2022 ] Training epoch: 28
[ Wed Sep 28 04:00:17 2022 ] 	Mean training loss: 0.4874. loss2: 0.0000. Mean training acc: 84.58%.
[ Wed Sep 28 04:00:17 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:00:17 2022 ] Eval epoch: 28
[ Wed Sep 28 04:00:46 2022 ] 	Mean test loss of 258 batches: 0.6679241646860921.
[ Wed Sep 28 04:00:46 2022 ] 	Top1: 80.54%
[ Wed Sep 28 04:00:46 2022 ] 	Top5: 96.31%
[ Wed Sep 28 04:00:46 2022 ] Training epoch: 29
[ Wed Sep 28 04:03:54 2022 ] 	Mean training loss: 0.4862. loss2: 0.0000. Mean training acc: 84.70%.
[ Wed Sep 28 04:03:54 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:03:54 2022 ] Eval epoch: 29
[ Wed Sep 28 04:04:24 2022 ] 	Mean test loss of 258 batches: 0.707821100545946.
[ Wed Sep 28 04:04:24 2022 ] 	Top1: 79.65%
[ Wed Sep 28 04:04:24 2022 ] 	Top5: 95.91%
[ Wed Sep 28 04:04:24 2022 ] Training epoch: 30
[ Wed Sep 28 04:07:32 2022 ] 	Mean training loss: 0.4775. loss2: 0.0000. Mean training acc: 85.11%.
[ Wed Sep 28 04:07:32 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:07:32 2022 ] Eval epoch: 30
[ Wed Sep 28 04:08:01 2022 ] 	Mean test loss of 258 batches: 0.5674206289325574.
[ Wed Sep 28 04:08:01 2022 ] 	Top1: 82.37%
[ Wed Sep 28 04:08:01 2022 ] 	Top5: 97.34%
[ Wed Sep 28 04:08:01 2022 ] Training epoch: 31
[ Wed Sep 28 04:11:10 2022 ] 	Mean training loss: 0.4799. loss2: 0.0000. Mean training acc: 84.83%.
[ Wed Sep 28 04:11:10 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:11:10 2022 ] Eval epoch: 31
[ Wed Sep 28 04:11:39 2022 ] 	Mean test loss of 258 batches: 0.6669517566644868.
[ Wed Sep 28 04:11:39 2022 ] 	Top1: 80.21%
[ Wed Sep 28 04:11:39 2022 ] 	Top5: 96.88%
[ Wed Sep 28 04:11:39 2022 ] Training epoch: 32
[ Wed Sep 28 04:14:48 2022 ] 	Mean training loss: 0.4840. loss2: 0.0000. Mean training acc: 84.80%.
[ Wed Sep 28 04:14:48 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:14:48 2022 ] Eval epoch: 32
[ Wed Sep 28 04:15:17 2022 ] 	Mean test loss of 258 batches: 0.7953194513801456.
[ Wed Sep 28 04:15:17 2022 ] 	Top1: 76.56%
[ Wed Sep 28 04:15:17 2022 ] 	Top5: 96.03%
[ Wed Sep 28 04:15:17 2022 ] Training epoch: 33
[ Wed Sep 28 04:18:25 2022 ] 	Mean training loss: 0.4781. loss2: 0.0000. Mean training acc: 84.81%.
[ Wed Sep 28 04:18:25 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:18:25 2022 ] Eval epoch: 33
[ Wed Sep 28 04:18:55 2022 ] 	Mean test loss of 258 batches: 0.6784725163218587.
[ Wed Sep 28 04:18:55 2022 ] 	Top1: 79.29%
[ Wed Sep 28 04:18:55 2022 ] 	Top5: 96.71%
[ Wed Sep 28 04:18:55 2022 ] Training epoch: 34
[ Wed Sep 28 04:22:04 2022 ] 	Mean training loss: 0.4735. loss2: 0.0000. Mean training acc: 85.19%.
[ Wed Sep 28 04:22:04 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:22:04 2022 ] Eval epoch: 34
[ Wed Sep 28 04:22:33 2022 ] 	Mean test loss of 258 batches: 0.6618149862155434.
[ Wed Sep 28 04:22:33 2022 ] 	Top1: 80.57%
[ Wed Sep 28 04:22:33 2022 ] 	Top5: 96.79%
[ Wed Sep 28 04:22:33 2022 ] Training epoch: 35
[ Wed Sep 28 04:25:42 2022 ] 	Mean training loss: 0.4705. loss2: 0.0000. Mean training acc: 84.95%.
[ Wed Sep 28 04:25:42 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:25:42 2022 ] Eval epoch: 35
[ Wed Sep 28 04:26:11 2022 ] 	Mean test loss of 258 batches: 0.7049794714580211.
[ Wed Sep 28 04:26:11 2022 ] 	Top1: 79.17%
[ Wed Sep 28 04:26:11 2022 ] 	Top5: 96.04%
[ Wed Sep 28 04:26:11 2022 ] Training epoch: 36
[ Wed Sep 28 04:29:19 2022 ] 	Mean training loss: 0.4655. loss2: 0.0000. Mean training acc: 85.40%.
[ Wed Sep 28 04:29:19 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:29:19 2022 ] Eval epoch: 36
[ Wed Sep 28 04:29:48 2022 ] 	Mean test loss of 258 batches: 0.5831783636826877.
[ Wed Sep 28 04:29:49 2022 ] 	Top1: 82.46%
[ Wed Sep 28 04:29:49 2022 ] 	Top5: 96.67%
[ Wed Sep 28 04:29:49 2022 ] Training epoch: 37
[ Wed Sep 28 04:32:57 2022 ] 	Mean training loss: 0.4749. loss2: 0.0000. Mean training acc: 84.86%.
[ Wed Sep 28 04:32:57 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:32:57 2022 ] Eval epoch: 37
[ Wed Sep 28 04:33:26 2022 ] 	Mean test loss of 258 batches: 0.6418631412955218.
[ Wed Sep 28 04:33:26 2022 ] 	Top1: 81.42%
[ Wed Sep 28 04:33:26 2022 ] 	Top5: 96.35%
[ Wed Sep 28 04:33:26 2022 ] Training epoch: 38
[ Wed Sep 28 04:36:35 2022 ] 	Mean training loss: 0.4657. loss2: 0.0000. Mean training acc: 85.12%.
[ Wed Sep 28 04:36:35 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:36:35 2022 ] Eval epoch: 38
[ Wed Sep 28 04:37:04 2022 ] 	Mean test loss of 258 batches: 0.6085747905479845.
[ Wed Sep 28 04:37:04 2022 ] 	Top1: 81.37%
[ Wed Sep 28 04:37:04 2022 ] 	Top5: 96.88%
[ Wed Sep 28 04:37:04 2022 ] Training epoch: 39
[ Wed Sep 28 04:40:13 2022 ] 	Mean training loss: 0.4646. loss2: 0.0000. Mean training acc: 85.39%.
[ Wed Sep 28 04:40:13 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:40:13 2022 ] Eval epoch: 39
[ Wed Sep 28 04:40:42 2022 ] 	Mean test loss of 258 batches: 0.7151490392148957.
[ Wed Sep 28 04:40:42 2022 ] 	Top1: 79.46%
[ Wed Sep 28 04:40:42 2022 ] 	Top5: 96.46%
[ Wed Sep 28 04:40:42 2022 ] Training epoch: 40
[ Wed Sep 28 04:43:51 2022 ] 	Mean training loss: 0.4584. loss2: 0.0000. Mean training acc: 85.45%.
[ Wed Sep 28 04:43:51 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:43:51 2022 ] Eval epoch: 40
[ Wed Sep 28 04:44:20 2022 ] 	Mean test loss of 258 batches: 0.619123868586481.
[ Wed Sep 28 04:44:20 2022 ] 	Top1: 81.39%
[ Wed Sep 28 04:44:20 2022 ] 	Top5: 97.24%
[ Wed Sep 28 04:44:20 2022 ] Training epoch: 41
[ Wed Sep 28 04:47:29 2022 ] 	Mean training loss: 0.4621. loss2: 0.0000. Mean training acc: 85.37%.
[ Wed Sep 28 04:47:29 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:47:29 2022 ] Eval epoch: 41
[ Wed Sep 28 04:47:58 2022 ] 	Mean test loss of 258 batches: 0.666341933804427.
[ Wed Sep 28 04:47:58 2022 ] 	Top1: 80.05%
[ Wed Sep 28 04:47:58 2022 ] 	Top5: 96.68%
[ Wed Sep 28 04:47:58 2022 ] Training epoch: 42
[ Wed Sep 28 04:51:07 2022 ] 	Mean training loss: 0.4546. loss2: 0.0000. Mean training acc: 85.65%.
[ Wed Sep 28 04:51:07 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:51:07 2022 ] Eval epoch: 42
[ Wed Sep 28 04:51:36 2022 ] 	Mean test loss of 258 batches: 0.6176487130019092.
[ Wed Sep 28 04:51:36 2022 ] 	Top1: 81.72%
[ Wed Sep 28 04:51:37 2022 ] 	Top5: 96.63%
[ Wed Sep 28 04:51:37 2022 ] Training epoch: 43
[ Wed Sep 28 04:54:45 2022 ] 	Mean training loss: 0.4663. loss2: 0.0000. Mean training acc: 85.44%.
[ Wed Sep 28 04:54:45 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:54:45 2022 ] Eval epoch: 43
[ Wed Sep 28 04:55:14 2022 ] 	Mean test loss of 258 batches: 0.6538950737941173.
[ Wed Sep 28 04:55:15 2022 ] 	Top1: 80.75%
[ Wed Sep 28 04:55:15 2022 ] 	Top5: 96.96%
[ Wed Sep 28 04:55:15 2022 ] Training epoch: 44
[ Wed Sep 28 04:58:23 2022 ] 	Mean training loss: 0.4655. loss2: 0.0000. Mean training acc: 85.31%.
[ Wed Sep 28 04:58:23 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:58:23 2022 ] Eval epoch: 44
[ Wed Sep 28 04:58:52 2022 ] 	Mean test loss of 258 batches: 0.6330009553321573.
[ Wed Sep 28 04:58:53 2022 ] 	Top1: 81.07%
[ Wed Sep 28 04:58:53 2022 ] 	Top5: 96.77%
[ Wed Sep 28 04:58:53 2022 ] Training epoch: 45
[ Wed Sep 28 05:02:01 2022 ] 	Mean training loss: 0.4539. loss2: 0.0000. Mean training acc: 85.71%.
[ Wed Sep 28 05:02:01 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:02:01 2022 ] Eval epoch: 45
[ Wed Sep 28 05:02:31 2022 ] 	Mean test loss of 258 batches: 0.7005315753378609.
[ Wed Sep 28 05:02:31 2022 ] 	Top1: 79.71%
[ Wed Sep 28 05:02:31 2022 ] 	Top5: 95.78%
[ Wed Sep 28 05:02:31 2022 ] Training epoch: 46
[ Wed Sep 28 05:05:39 2022 ] 	Mean training loss: 0.4575. loss2: 0.0000. Mean training acc: 85.65%.
[ Wed Sep 28 05:05:39 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:05:39 2022 ] Eval epoch: 46
[ Wed Sep 28 05:06:08 2022 ] 	Mean test loss of 258 batches: 0.6177382532131764.
[ Wed Sep 28 05:06:08 2022 ] 	Top1: 80.93%
[ Wed Sep 28 05:06:08 2022 ] 	Top5: 96.83%
[ Wed Sep 28 05:06:08 2022 ] Training epoch: 47
[ Wed Sep 28 05:09:16 2022 ] 	Mean training loss: 0.4546. loss2: 0.0000. Mean training acc: 85.63%.
[ Wed Sep 28 05:09:16 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:09:17 2022 ] Eval epoch: 47
[ Wed Sep 28 05:09:46 2022 ] 	Mean test loss of 258 batches: 0.6599654506227767.
[ Wed Sep 28 05:09:46 2022 ] 	Top1: 81.15%
[ Wed Sep 28 05:09:46 2022 ] 	Top5: 96.05%
[ Wed Sep 28 05:09:46 2022 ] Training epoch: 48
[ Wed Sep 28 05:12:54 2022 ] 	Mean training loss: 0.4565. loss2: 0.0000. Mean training acc: 85.39%.
[ Wed Sep 28 05:12:54 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:12:54 2022 ] Eval epoch: 48
[ Wed Sep 28 05:13:23 2022 ] 	Mean test loss of 258 batches: 0.6207329240995784.
[ Wed Sep 28 05:13:23 2022 ] 	Top1: 81.59%
[ Wed Sep 28 05:13:23 2022 ] 	Top5: 96.71%
[ Wed Sep 28 05:13:23 2022 ] Training epoch: 49
[ Wed Sep 28 05:16:32 2022 ] 	Mean training loss: 0.4607. loss2: 0.0000. Mean training acc: 85.43%.
[ Wed Sep 28 05:16:32 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:16:32 2022 ] Eval epoch: 49
[ Wed Sep 28 05:17:01 2022 ] 	Mean test loss of 258 batches: 0.670475622769012.
[ Wed Sep 28 05:17:01 2022 ] 	Top1: 79.83%
[ Wed Sep 28 05:17:01 2022 ] 	Top5: 96.65%
[ Wed Sep 28 05:17:01 2022 ] Training epoch: 50
[ Wed Sep 28 05:20:10 2022 ] 	Mean training loss: 0.4525. loss2: 0.0000. Mean training acc: 85.57%.
[ Wed Sep 28 05:20:10 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:20:10 2022 ] Eval epoch: 50
[ Wed Sep 28 05:20:39 2022 ] 	Mean test loss of 258 batches: 0.6518946077183697.
[ Wed Sep 28 05:20:39 2022 ] 	Top1: 80.51%
[ Wed Sep 28 05:20:39 2022 ] 	Top5: 96.95%
[ Wed Sep 28 05:20:39 2022 ] Training epoch: 51
[ Wed Sep 28 05:23:47 2022 ] 	Mean training loss: 0.4571. loss2: 0.0000. Mean training acc: 85.63%.
[ Wed Sep 28 05:23:47 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:23:47 2022 ] Eval epoch: 51
[ Wed Sep 28 05:24:17 2022 ] 	Mean test loss of 258 batches: 0.7150924331111501.
[ Wed Sep 28 05:24:17 2022 ] 	Top1: 80.25%
[ Wed Sep 28 05:24:17 2022 ] 	Top5: 95.42%
[ Wed Sep 28 05:24:17 2022 ] Training epoch: 52
[ Wed Sep 28 05:27:26 2022 ] 	Mean training loss: 0.4539. loss2: 0.0000. Mean training acc: 85.69%.
[ Wed Sep 28 05:27:26 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:27:26 2022 ] Eval epoch: 52
[ Wed Sep 28 05:27:56 2022 ] 	Mean test loss of 258 batches: 0.5571162048005318.
[ Wed Sep 28 05:27:56 2022 ] 	Top1: 83.14%
[ Wed Sep 28 05:27:56 2022 ] 	Top5: 97.31%
[ Wed Sep 28 05:27:56 2022 ] Training epoch: 53
[ Wed Sep 28 05:31:04 2022 ] 	Mean training loss: 0.4433. loss2: 0.0000. Mean training acc: 85.84%.
[ Wed Sep 28 05:31:04 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:31:04 2022 ] Eval epoch: 53
[ Wed Sep 28 05:31:34 2022 ] 	Mean test loss of 258 batches: 0.5384904135220735.
[ Wed Sep 28 05:31:34 2022 ] 	Top1: 83.66%
[ Wed Sep 28 05:31:34 2022 ] 	Top5: 97.55%
[ Wed Sep 28 05:31:34 2022 ] Training epoch: 54
[ Wed Sep 28 05:34:42 2022 ] 	Mean training loss: 0.4491. loss2: 0.0000. Mean training acc: 85.91%.
[ Wed Sep 28 05:34:42 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:34:43 2022 ] Eval epoch: 54
[ Wed Sep 28 05:35:12 2022 ] 	Mean test loss of 258 batches: 0.6318401818474134.
[ Wed Sep 28 05:35:12 2022 ] 	Top1: 80.94%
[ Wed Sep 28 05:35:12 2022 ] 	Top5: 97.16%
[ Wed Sep 28 05:35:12 2022 ] Training epoch: 55
[ Wed Sep 28 05:38:20 2022 ] 	Mean training loss: 0.4496. loss2: 0.0000. Mean training acc: 85.72%.
[ Wed Sep 28 05:38:20 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:38:21 2022 ] Eval epoch: 55
[ Wed Sep 28 05:38:50 2022 ] 	Mean test loss of 258 batches: 0.7090509162508241.
[ Wed Sep 28 05:38:50 2022 ] 	Top1: 79.30%
[ Wed Sep 28 05:38:50 2022 ] 	Top5: 96.02%
[ Wed Sep 28 05:38:50 2022 ] Training epoch: 56
[ Wed Sep 28 05:41:58 2022 ] 	Mean training loss: 0.4498. loss2: 0.0000. Mean training acc: 85.69%.
[ Wed Sep 28 05:41:58 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:41:59 2022 ] Eval epoch: 56
[ Wed Sep 28 05:42:28 2022 ] 	Mean test loss of 258 batches: 0.6893509289090948.
[ Wed Sep 28 05:42:28 2022 ] 	Top1: 80.23%
[ Wed Sep 28 05:42:28 2022 ] 	Top5: 95.54%
[ Wed Sep 28 05:42:28 2022 ] Training epoch: 57
[ Wed Sep 28 05:45:37 2022 ] 	Mean training loss: 0.4477. loss2: 0.0000. Mean training acc: 85.89%.
[ Wed Sep 28 05:45:37 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:45:37 2022 ] Eval epoch: 57
[ Wed Sep 28 05:46:06 2022 ] 	Mean test loss of 258 batches: 0.7472601521384808.
[ Wed Sep 28 05:46:06 2022 ] 	Top1: 77.74%
[ Wed Sep 28 05:46:06 2022 ] 	Top5: 96.08%
[ Wed Sep 28 05:46:06 2022 ] Training epoch: 58
[ Wed Sep 28 05:49:15 2022 ] 	Mean training loss: 0.4464. loss2: 0.0000. Mean training acc: 85.99%.
[ Wed Sep 28 05:49:15 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:49:15 2022 ] Eval epoch: 58
[ Wed Sep 28 05:49:44 2022 ] 	Mean test loss of 258 batches: 0.7050603220975676.
[ Wed Sep 28 05:49:44 2022 ] 	Top1: 79.23%
[ Wed Sep 28 05:49:44 2022 ] 	Top5: 96.49%
[ Wed Sep 28 05:49:44 2022 ] Training epoch: 59
[ Wed Sep 28 05:52:53 2022 ] 	Mean training loss: 0.4506. loss2: 0.0000. Mean training acc: 85.52%.
[ Wed Sep 28 05:52:53 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:52:53 2022 ] Eval epoch: 59
[ Wed Sep 28 05:53:22 2022 ] 	Mean test loss of 258 batches: 0.7167349231913108.
[ Wed Sep 28 05:53:22 2022 ] 	Top1: 78.75%
[ Wed Sep 28 05:53:22 2022 ] 	Top5: 96.39%
[ Wed Sep 28 05:53:22 2022 ] Training epoch: 60
[ Wed Sep 28 05:56:31 2022 ] 	Mean training loss: 0.4441. loss2: 0.0000. Mean training acc: 86.11%.
[ Wed Sep 28 05:56:31 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:56:31 2022 ] Eval epoch: 60
[ Wed Sep 28 05:57:00 2022 ] 	Mean test loss of 258 batches: 0.6847753101425578.
[ Wed Sep 28 05:57:00 2022 ] 	Top1: 80.15%
[ Wed Sep 28 05:57:00 2022 ] 	Top5: 96.49%
[ Wed Sep 28 05:57:00 2022 ] Training epoch: 61
[ Wed Sep 28 06:00:09 2022 ] 	Mean training loss: 0.4426. loss2: 0.0000. Mean training acc: 85.74%.
[ Wed Sep 28 06:00:09 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:00:09 2022 ] Eval epoch: 61
[ Wed Sep 28 06:00:38 2022 ] 	Mean test loss of 258 batches: 0.6081996784769288.
[ Wed Sep 28 06:00:38 2022 ] 	Top1: 81.96%
[ Wed Sep 28 06:00:39 2022 ] 	Top5: 96.71%
[ Wed Sep 28 06:00:39 2022 ] Training epoch: 62
[ Wed Sep 28 06:03:47 2022 ] 	Mean training loss: 0.4433. loss2: 0.0000. Mean training acc: 85.97%.
[ Wed Sep 28 06:03:47 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:03:47 2022 ] Eval epoch: 62
[ Wed Sep 28 06:04:16 2022 ] 	Mean test loss of 258 batches: 0.6929175730708034.
[ Wed Sep 28 06:04:17 2022 ] 	Top1: 79.13%
[ Wed Sep 28 06:04:17 2022 ] 	Top5: 96.62%
[ Wed Sep 28 06:04:17 2022 ] Training epoch: 63
[ Wed Sep 28 06:07:25 2022 ] 	Mean training loss: 0.4463. loss2: 0.0000. Mean training acc: 85.93%.
[ Wed Sep 28 06:07:25 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:07:25 2022 ] Eval epoch: 63
[ Wed Sep 28 06:07:54 2022 ] 	Mean test loss of 258 batches: 1.0479873432669529.
[ Wed Sep 28 06:07:54 2022 ] 	Top1: 70.38%
[ Wed Sep 28 06:07:55 2022 ] 	Top5: 92.50%
[ Wed Sep 28 06:07:55 2022 ] Training epoch: 64
[ Wed Sep 28 06:11:03 2022 ] 	Mean training loss: 0.4498. loss2: 0.0000. Mean training acc: 85.71%.
[ Wed Sep 28 06:11:03 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:11:03 2022 ] Eval epoch: 64
[ Wed Sep 28 06:11:33 2022 ] 	Mean test loss of 258 batches: 0.5702283750324286.
[ Wed Sep 28 06:11:33 2022 ] 	Top1: 82.99%
[ Wed Sep 28 06:11:33 2022 ] 	Top5: 97.18%
[ Wed Sep 28 06:11:33 2022 ] Training epoch: 65
[ Wed Sep 28 06:14:42 2022 ] 	Mean training loss: 0.4351. loss2: 0.0000. Mean training acc: 86.36%.
[ Wed Sep 28 06:14:42 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:14:42 2022 ] Eval epoch: 65
[ Wed Sep 28 06:15:12 2022 ] 	Mean test loss of 258 batches: 0.7059276578731315.
[ Wed Sep 28 06:15:12 2022 ] 	Top1: 78.81%
[ Wed Sep 28 06:15:12 2022 ] 	Top5: 96.58%
[ Wed Sep 28 06:15:12 2022 ] Training epoch: 66
[ Wed Sep 28 06:18:21 2022 ] 	Mean training loss: 0.4537. loss2: 0.0000. Mean training acc: 85.62%.
[ Wed Sep 28 06:18:21 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:18:21 2022 ] Eval epoch: 66
[ Wed Sep 28 06:18:50 2022 ] 	Mean test loss of 258 batches: 0.6382228062134381.
[ Wed Sep 28 06:18:51 2022 ] 	Top1: 81.55%
[ Wed Sep 28 06:18:51 2022 ] 	Top5: 96.03%
[ Wed Sep 28 06:18:51 2022 ] Training epoch: 67
[ Wed Sep 28 06:21:59 2022 ] 	Mean training loss: 0.4423. loss2: 0.0000. Mean training acc: 85.91%.
[ Wed Sep 28 06:21:59 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:22:00 2022 ] Eval epoch: 67
[ Wed Sep 28 06:22:29 2022 ] 	Mean test loss of 258 batches: 0.7106402097630871.
[ Wed Sep 28 06:22:29 2022 ] 	Top1: 78.80%
[ Wed Sep 28 06:22:29 2022 ] 	Top5: 95.32%
[ Wed Sep 28 06:22:29 2022 ] Training epoch: 68
[ Wed Sep 28 06:25:38 2022 ] 	Mean training loss: 0.4440. loss2: 0.0000. Mean training acc: 85.97%.
[ Wed Sep 28 06:25:38 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:25:38 2022 ] Eval epoch: 68
[ Wed Sep 28 06:26:07 2022 ] 	Mean test loss of 258 batches: 0.5749071050406427.
[ Wed Sep 28 06:26:07 2022 ] 	Top1: 82.28%
[ Wed Sep 28 06:26:08 2022 ] 	Top5: 97.17%
[ Wed Sep 28 06:26:08 2022 ] Training epoch: 69
[ Wed Sep 28 06:29:17 2022 ] 	Mean training loss: 0.4475. loss2: 0.0000. Mean training acc: 85.86%.
[ Wed Sep 28 06:29:17 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:29:17 2022 ] Eval epoch: 69
[ Wed Sep 28 06:29:46 2022 ] 	Mean test loss of 258 batches: 0.6500283525895703.
[ Wed Sep 28 06:29:46 2022 ] 	Top1: 80.57%
[ Wed Sep 28 06:29:46 2022 ] 	Top5: 96.60%
[ Wed Sep 28 06:29:46 2022 ] Training epoch: 70
[ Wed Sep 28 06:32:54 2022 ] 	Mean training loss: 0.4455. loss2: 0.0000. Mean training acc: 86.05%.
[ Wed Sep 28 06:32:54 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:32:54 2022 ] Eval epoch: 70
[ Wed Sep 28 06:33:24 2022 ] 	Mean test loss of 258 batches: 0.6210426840093709.
[ Wed Sep 28 06:33:24 2022 ] 	Top1: 81.61%
[ Wed Sep 28 06:33:24 2022 ] 	Top5: 96.51%
[ Wed Sep 28 06:33:24 2022 ] Training epoch: 71
[ Wed Sep 28 06:36:32 2022 ] 	Mean training loss: 0.4382. loss2: 0.0000. Mean training acc: 86.17%.
[ Wed Sep 28 06:36:32 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:36:33 2022 ] Eval epoch: 71
[ Wed Sep 28 06:37:02 2022 ] 	Mean test loss of 258 batches: 0.5091694409186526.
[ Wed Sep 28 06:37:02 2022 ] 	Top1: 84.70%
[ Wed Sep 28 06:37:02 2022 ] 	Top5: 97.48%
[ Wed Sep 28 06:37:02 2022 ] Training epoch: 72
[ Wed Sep 28 06:40:11 2022 ] 	Mean training loss: 0.4415. loss2: 0.0000. Mean training acc: 85.99%.
[ Wed Sep 28 06:40:11 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:40:11 2022 ] Eval epoch: 72
[ Wed Sep 28 06:40:40 2022 ] 	Mean test loss of 258 batches: 0.809270564091298.
[ Wed Sep 28 06:40:40 2022 ] 	Top1: 77.44%
[ Wed Sep 28 06:40:41 2022 ] 	Top5: 95.82%
[ Wed Sep 28 06:40:41 2022 ] Training epoch: 73
[ Wed Sep 28 06:43:49 2022 ] 	Mean training loss: 0.4453. loss2: 0.0000. Mean training acc: 85.91%.
[ Wed Sep 28 06:43:49 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:43:49 2022 ] Eval epoch: 73
[ Wed Sep 28 06:44:18 2022 ] 	Mean test loss of 258 batches: 0.5805869637013867.
[ Wed Sep 28 06:44:18 2022 ] 	Top1: 83.17%
[ Wed Sep 28 06:44:18 2022 ] 	Top5: 96.79%
[ Wed Sep 28 06:44:18 2022 ] Training epoch: 74
[ Wed Sep 28 06:47:27 2022 ] 	Mean training loss: 0.4465. loss2: 0.0000. Mean training acc: 85.80%.
[ Wed Sep 28 06:47:27 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:47:27 2022 ] Eval epoch: 74
[ Wed Sep 28 06:47:56 2022 ] 	Mean test loss of 258 batches: 0.7034571607445561.
[ Wed Sep 28 06:47:56 2022 ] 	Top1: 79.12%
[ Wed Sep 28 06:47:56 2022 ] 	Top5: 96.20%
[ Wed Sep 28 06:47:56 2022 ] Training epoch: 75
[ Wed Sep 28 06:51:05 2022 ] 	Mean training loss: 0.4390. loss2: 0.0000. Mean training acc: 86.01%.
[ Wed Sep 28 06:51:05 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:51:05 2022 ] Eval epoch: 75
[ Wed Sep 28 06:51:34 2022 ] 	Mean test loss of 258 batches: 0.5203733968642331.
[ Wed Sep 28 06:51:34 2022 ] 	Top1: 84.38%
[ Wed Sep 28 06:51:34 2022 ] 	Top5: 97.38%
[ Wed Sep 28 06:51:34 2022 ] Training epoch: 76
[ Wed Sep 28 06:54:43 2022 ] 	Mean training loss: 0.4494. loss2: 0.0000. Mean training acc: 85.94%.
[ Wed Sep 28 06:54:43 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:54:43 2022 ] Eval epoch: 76
[ Wed Sep 28 06:55:12 2022 ] 	Mean test loss of 258 batches: 0.5217568001081777.
[ Wed Sep 28 06:55:12 2022 ] 	Top1: 84.10%
[ Wed Sep 28 06:55:12 2022 ] 	Top5: 97.81%
[ Wed Sep 28 06:55:12 2022 ] Training epoch: 77
[ Wed Sep 28 06:58:21 2022 ] 	Mean training loss: 0.4440. loss2: 0.0000. Mean training acc: 86.00%.
[ Wed Sep 28 06:58:21 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:58:21 2022 ] Eval epoch: 77
[ Wed Sep 28 06:58:50 2022 ] 	Mean test loss of 258 batches: 0.5482498488222906.
[ Wed Sep 28 06:58:50 2022 ] 	Top1: 83.94%
[ Wed Sep 28 06:58:50 2022 ] 	Top5: 97.09%
[ Wed Sep 28 06:58:50 2022 ] Training epoch: 78
[ Wed Sep 28 07:01:59 2022 ] 	Mean training loss: 0.4400. loss2: 0.0000. Mean training acc: 86.08%.
[ Wed Sep 28 07:01:59 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:01:59 2022 ] Eval epoch: 78
[ Wed Sep 28 07:02:28 2022 ] 	Mean test loss of 258 batches: 0.7198580495377843.
[ Wed Sep 28 07:02:28 2022 ] 	Top1: 78.79%
[ Wed Sep 28 07:02:28 2022 ] 	Top5: 96.51%
[ Wed Sep 28 07:02:28 2022 ] Training epoch: 79
[ Wed Sep 28 07:05:37 2022 ] 	Mean training loss: 0.4382. loss2: 0.0000. Mean training acc: 86.28%.
[ Wed Sep 28 07:05:37 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:05:37 2022 ] Eval epoch: 79
[ Wed Sep 28 07:06:06 2022 ] 	Mean test loss of 258 batches: 0.7160704853229745.
[ Wed Sep 28 07:06:06 2022 ] 	Top1: 79.64%
[ Wed Sep 28 07:06:07 2022 ] 	Top5: 96.05%
[ Wed Sep 28 07:06:07 2022 ] Training epoch: 80
[ Wed Sep 28 07:09:16 2022 ] 	Mean training loss: 0.4465. loss2: 0.0000. Mean training acc: 86.00%.
[ Wed Sep 28 07:09:16 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:09:16 2022 ] Eval epoch: 80
[ Wed Sep 28 07:09:45 2022 ] 	Mean test loss of 258 batches: 0.5858870182157487.
[ Wed Sep 28 07:09:45 2022 ] 	Top1: 82.99%
[ Wed Sep 28 07:09:45 2022 ] 	Top5: 96.58%
[ Wed Sep 28 07:09:45 2022 ] Training epoch: 81
[ Wed Sep 28 07:12:55 2022 ] 	Mean training loss: 0.4383. loss2: 0.0000. Mean training acc: 86.08%.
[ Wed Sep 28 07:12:55 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:12:55 2022 ] Eval epoch: 81
[ Wed Sep 28 07:13:25 2022 ] 	Mean test loss of 258 batches: 0.8263732926097027.
[ Wed Sep 28 07:13:25 2022 ] 	Top1: 76.45%
[ Wed Sep 28 07:13:25 2022 ] 	Top5: 95.88%
[ Wed Sep 28 07:13:25 2022 ] Training epoch: 82
[ Wed Sep 28 07:16:33 2022 ] 	Mean training loss: 0.4370. loss2: 0.0000. Mean training acc: 86.07%.
[ Wed Sep 28 07:16:33 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:16:33 2022 ] Eval epoch: 82
[ Wed Sep 28 07:17:03 2022 ] 	Mean test loss of 258 batches: 0.6873535840779312.
[ Wed Sep 28 07:17:03 2022 ] 	Top1: 80.29%
[ Wed Sep 28 07:17:03 2022 ] 	Top5: 96.25%
[ Wed Sep 28 07:17:03 2022 ] Training epoch: 83
[ Wed Sep 28 07:20:11 2022 ] 	Mean training loss: 0.4396. loss2: 0.0000. Mean training acc: 86.13%.
[ Wed Sep 28 07:20:11 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:20:11 2022 ] Eval epoch: 83
[ Wed Sep 28 07:20:41 2022 ] 	Mean test loss of 258 batches: 0.5747178297403247.
[ Wed Sep 28 07:20:41 2022 ] 	Top1: 82.85%
[ Wed Sep 28 07:20:41 2022 ] 	Top5: 97.17%
[ Wed Sep 28 07:20:41 2022 ] Training epoch: 84
[ Wed Sep 28 07:23:49 2022 ] 	Mean training loss: 0.4493. loss2: 0.0000. Mean training acc: 85.84%.
[ Wed Sep 28 07:23:49 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:23:49 2022 ] Eval epoch: 84
[ Wed Sep 28 07:24:19 2022 ] 	Mean test loss of 258 batches: 0.61921645839547.
[ Wed Sep 28 07:24:19 2022 ] 	Top1: 81.66%
[ Wed Sep 28 07:24:19 2022 ] 	Top5: 96.60%
[ Wed Sep 28 07:24:19 2022 ] Training epoch: 85
[ Wed Sep 28 07:27:27 2022 ] 	Mean training loss: 0.4393. loss2: 0.0000. Mean training acc: 86.22%.
[ Wed Sep 28 07:27:27 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:27:28 2022 ] Eval epoch: 85
[ Wed Sep 28 07:27:57 2022 ] 	Mean test loss of 258 batches: 0.5432397793660793.
[ Wed Sep 28 07:27:57 2022 ] 	Top1: 84.00%
[ Wed Sep 28 07:27:57 2022 ] 	Top5: 97.37%
[ Wed Sep 28 07:27:57 2022 ] Training epoch: 86
[ Wed Sep 28 07:31:05 2022 ] 	Mean training loss: 0.4374. loss2: 0.0000. Mean training acc: 86.20%.
[ Wed Sep 28 07:31:05 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:31:06 2022 ] Eval epoch: 86
[ Wed Sep 28 07:31:35 2022 ] 	Mean test loss of 258 batches: 0.5733217934305354.
[ Wed Sep 28 07:31:35 2022 ] 	Top1: 82.93%
[ Wed Sep 28 07:31:35 2022 ] 	Top5: 97.11%
[ Wed Sep 28 07:31:35 2022 ] Training epoch: 87
[ Wed Sep 28 07:34:43 2022 ] 	Mean training loss: 0.4367. loss2: 0.0000. Mean training acc: 86.31%.
[ Wed Sep 28 07:34:43 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:34:43 2022 ] Eval epoch: 87
[ Wed Sep 28 07:35:12 2022 ] 	Mean test loss of 258 batches: 0.7869017340185106.
[ Wed Sep 28 07:35:12 2022 ] 	Top1: 78.19%
[ Wed Sep 28 07:35:13 2022 ] 	Top5: 94.69%
[ Wed Sep 28 07:35:13 2022 ] Training epoch: 88
[ Wed Sep 28 07:38:21 2022 ] 	Mean training loss: 0.4352. loss2: 0.0000. Mean training acc: 86.14%.
[ Wed Sep 28 07:38:21 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:38:21 2022 ] Eval epoch: 88
[ Wed Sep 28 07:38:50 2022 ] 	Mean test loss of 258 batches: 0.7323342990274577.
[ Wed Sep 28 07:38:50 2022 ] 	Top1: 78.75%
[ Wed Sep 28 07:38:50 2022 ] 	Top5: 95.70%
[ Wed Sep 28 07:38:50 2022 ] Training epoch: 89
[ Wed Sep 28 07:41:58 2022 ] 	Mean training loss: 0.4426. loss2: 0.0000. Mean training acc: 86.00%.
[ Wed Sep 28 07:41:58 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:41:58 2022 ] Eval epoch: 89
[ Wed Sep 28 07:42:28 2022 ] 	Mean test loss of 258 batches: 1.2641661833654079.
[ Wed Sep 28 07:42:28 2022 ] 	Top1: 69.60%
[ Wed Sep 28 07:42:28 2022 ] 	Top5: 91.77%
[ Wed Sep 28 07:42:28 2022 ] Training epoch: 90
[ Wed Sep 28 07:45:36 2022 ] 	Mean training loss: 0.4349. loss2: 0.0000. Mean training acc: 86.42%.
[ Wed Sep 28 07:45:36 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:45:36 2022 ] Eval epoch: 90
[ Wed Sep 28 07:46:06 2022 ] 	Mean test loss of 258 batches: 0.6530224714399309.
[ Wed Sep 28 07:46:06 2022 ] 	Top1: 80.72%
[ Wed Sep 28 07:46:06 2022 ] 	Top5: 96.22%
[ Wed Sep 28 07:46:06 2022 ] Training epoch: 91
[ Wed Sep 28 07:49:14 2022 ] 	Mean training loss: 0.2525. loss2: 0.0000. Mean training acc: 92.33%.
[ Wed Sep 28 07:49:14 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:49:14 2022 ] Eval epoch: 91
[ Wed Sep 28 07:49:44 2022 ] 	Mean test loss of 258 batches: 0.3468180874636931.
[ Wed Sep 28 07:49:44 2022 ] 	Top1: 89.43%
[ Wed Sep 28 07:49:44 2022 ] 	Top5: 98.38%
[ Wed Sep 28 07:49:44 2022 ] Training epoch: 92
[ Wed Sep 28 07:52:52 2022 ] 	Mean training loss: 0.1957. loss2: 0.0000. Mean training acc: 94.00%.
[ Wed Sep 28 07:52:52 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:52:52 2022 ] Eval epoch: 92
[ Wed Sep 28 07:53:22 2022 ] 	Mean test loss of 258 batches: 0.3271688775325468.
[ Wed Sep 28 07:53:22 2022 ] 	Top1: 89.86%
[ Wed Sep 28 07:53:22 2022 ] 	Top5: 98.47%
[ Wed Sep 28 07:53:22 2022 ] Training epoch: 93
[ Wed Sep 28 07:56:30 2022 ] 	Mean training loss: 0.1737. loss2: 0.0000. Mean training acc: 94.77%.
[ Wed Sep 28 07:56:30 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:56:30 2022 ] Eval epoch: 93
[ Wed Sep 28 07:56:59 2022 ] 	Mean test loss of 258 batches: 0.3328651728238477.
[ Wed Sep 28 07:57:00 2022 ] 	Top1: 89.94%
[ Wed Sep 28 07:57:00 2022 ] 	Top5: 98.47%
[ Wed Sep 28 07:57:00 2022 ] Training epoch: 94
[ Wed Sep 28 08:00:08 2022 ] 	Mean training loss: 0.1571. loss2: 0.0000. Mean training acc: 95.28%.
[ Wed Sep 28 08:00:08 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:00:08 2022 ] Eval epoch: 94
[ Wed Sep 28 08:00:37 2022 ] 	Mean test loss of 258 batches: 0.3290264795916949.
[ Wed Sep 28 08:00:37 2022 ] 	Top1: 90.14%
[ Wed Sep 28 08:00:38 2022 ] 	Top5: 98.39%
[ Wed Sep 28 08:00:38 2022 ] Training epoch: 95
[ Wed Sep 28 08:03:46 2022 ] 	Mean training loss: 0.1447. loss2: 0.0000. Mean training acc: 95.68%.
[ Wed Sep 28 08:03:46 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:03:46 2022 ] Eval epoch: 95
[ Wed Sep 28 08:04:16 2022 ] 	Mean test loss of 258 batches: 0.33491964639677096.
[ Wed Sep 28 08:04:16 2022 ] 	Top1: 90.14%
[ Wed Sep 28 08:04:16 2022 ] 	Top5: 98.40%
[ Wed Sep 28 08:04:16 2022 ] Training epoch: 96
[ Wed Sep 28 08:07:24 2022 ] 	Mean training loss: 0.1342. loss2: 0.0000. Mean training acc: 96.08%.
[ Wed Sep 28 08:07:24 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:07:24 2022 ] Eval epoch: 96
[ Wed Sep 28 08:07:54 2022 ] 	Mean test loss of 258 batches: 0.34144062290415744.
[ Wed Sep 28 08:07:54 2022 ] 	Top1: 90.02%
[ Wed Sep 28 08:07:54 2022 ] 	Top5: 98.43%
[ Wed Sep 28 08:07:54 2022 ] Training epoch: 97
[ Wed Sep 28 08:11:02 2022 ] 	Mean training loss: 0.1238. loss2: 0.0000. Mean training acc: 96.43%.
[ Wed Sep 28 08:11:02 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:11:02 2022 ] Eval epoch: 97
[ Wed Sep 28 08:11:32 2022 ] 	Mean test loss of 258 batches: 0.3415847820655782.
[ Wed Sep 28 08:11:32 2022 ] 	Top1: 90.03%
[ Wed Sep 28 08:11:32 2022 ] 	Top5: 98.51%
[ Wed Sep 28 08:11:32 2022 ] Training epoch: 98
[ Wed Sep 28 08:14:40 2022 ] 	Mean training loss: 0.1186. loss2: 0.0000. Mean training acc: 96.67%.
[ Wed Sep 28 08:14:40 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:14:40 2022 ] Eval epoch: 98
[ Wed Sep 28 08:15:09 2022 ] 	Mean test loss of 258 batches: 0.35121652941659903.
[ Wed Sep 28 08:15:09 2022 ] 	Top1: 89.68%
[ Wed Sep 28 08:15:09 2022 ] 	Top5: 98.39%
[ Wed Sep 28 08:15:09 2022 ] Training epoch: 99
[ Wed Sep 28 08:18:18 2022 ] 	Mean training loss: 0.1149. loss2: 0.0000. Mean training acc: 96.63%.
[ Wed Sep 28 08:18:18 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:18:18 2022 ] Eval epoch: 99
[ Wed Sep 28 08:18:48 2022 ] 	Mean test loss of 258 batches: 0.3438698979412285.
[ Wed Sep 28 08:18:48 2022 ] 	Top1: 89.87%
[ Wed Sep 28 08:18:48 2022 ] 	Top5: 98.42%
[ Wed Sep 28 08:18:48 2022 ] Training epoch: 100
[ Wed Sep 28 08:21:57 2022 ] 	Mean training loss: 0.1038. loss2: 0.0000. Mean training acc: 97.04%.
[ Wed Sep 28 08:21:57 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:21:58 2022 ] Eval epoch: 100
[ Wed Sep 28 08:22:27 2022 ] 	Mean test loss of 258 batches: 0.3555631667029026.
[ Wed Sep 28 08:22:27 2022 ] 	Top1: 89.94%
[ Wed Sep 28 08:22:27 2022 ] 	Top5: 98.45%
[ Wed Sep 28 08:22:27 2022 ] Training epoch: 101
[ Wed Sep 28 08:25:35 2022 ] 	Mean training loss: 0.0843. loss2: 0.0000. Mean training acc: 97.70%.
[ Wed Sep 28 08:25:35 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:25:35 2022 ] Eval epoch: 101
[ Wed Sep 28 08:26:05 2022 ] 	Mean test loss of 258 batches: 0.3418828303701887.
[ Wed Sep 28 08:26:05 2022 ] 	Top1: 90.29%
[ Wed Sep 28 08:26:05 2022 ] 	Top5: 98.52%
[ Wed Sep 28 08:26:05 2022 ] Training epoch: 102
[ Wed Sep 28 08:29:13 2022 ] 	Mean training loss: 0.0780. loss2: 0.0000. Mean training acc: 97.94%.
[ Wed Sep 28 08:29:13 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:29:13 2022 ] Eval epoch: 102
[ Wed Sep 28 08:29:43 2022 ] 	Mean test loss of 258 batches: 0.34221560256772265.
[ Wed Sep 28 08:29:43 2022 ] 	Top1: 90.28%
[ Wed Sep 28 08:29:43 2022 ] 	Top5: 98.50%
[ Wed Sep 28 08:29:43 2022 ] Training epoch: 103
[ Wed Sep 28 08:32:52 2022 ] 	Mean training loss: 0.0683. loss2: 0.0000. Mean training acc: 98.33%.
[ Wed Sep 28 08:32:52 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:32:52 2022 ] Eval epoch: 103
[ Wed Sep 28 08:33:21 2022 ] 	Mean test loss of 258 batches: 0.34163488825970845.
[ Wed Sep 28 08:33:21 2022 ] 	Top1: 90.24%
[ Wed Sep 28 08:33:21 2022 ] 	Top5: 98.50%
[ Wed Sep 28 08:33:21 2022 ] Training epoch: 104
[ Wed Sep 28 08:36:29 2022 ] 	Mean training loss: 0.0678. loss2: 0.0000. Mean training acc: 98.45%.
[ Wed Sep 28 08:36:29 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:36:29 2022 ] Eval epoch: 104
[ Wed Sep 28 08:36:59 2022 ] 	Mean test loss of 258 batches: 0.34277743540790884.
[ Wed Sep 28 08:36:59 2022 ] 	Top1: 90.36%
[ Wed Sep 28 08:36:59 2022 ] 	Top5: 98.50%
[ Wed Sep 28 08:36:59 2022 ] Training epoch: 105
[ Wed Sep 28 08:40:07 2022 ] 	Mean training loss: 0.0666. loss2: 0.0000. Mean training acc: 98.38%.
[ Wed Sep 28 08:40:07 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:40:07 2022 ] Eval epoch: 105
[ Wed Sep 28 08:40:36 2022 ] 	Mean test loss of 258 batches: 0.34543556013945925.
[ Wed Sep 28 08:40:37 2022 ] 	Top1: 90.20%
[ Wed Sep 28 08:40:37 2022 ] 	Top5: 98.44%
[ Wed Sep 28 08:40:37 2022 ] Training epoch: 106
[ Wed Sep 28 08:43:45 2022 ] 	Mean training loss: 0.0653. loss2: 0.0000. Mean training acc: 98.47%.
[ Wed Sep 28 08:43:45 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:43:45 2022 ] Eval epoch: 106
[ Wed Sep 28 08:44:14 2022 ] 	Mean test loss of 258 batches: 0.3455648514940295.
[ Wed Sep 28 08:44:15 2022 ] 	Top1: 90.25%
[ Wed Sep 28 08:44:15 2022 ] 	Top5: 98.51%
[ Wed Sep 28 08:44:15 2022 ] Training epoch: 107
[ Wed Sep 28 08:47:23 2022 ] 	Mean training loss: 0.0645. loss2: 0.0000. Mean training acc: 98.44%.
[ Wed Sep 28 08:47:23 2022 ] 	Time consumption: [Data]01%, [Network]98%
[ Wed Sep 28 08:47:23 2022 ] Eval epoch: 107
[ Wed Sep 28 08:47:52 2022 ] 	Mean test loss of 258 batches: 0.3448248983165899.
[ Wed Sep 28 08:47:52 2022 ] 	Top1: 90.36%
[ Wed Sep 28 08:47:52 2022 ] 	Top5: 98.44%
[ Wed Sep 28 08:47:52 2022 ] Training epoch: 108
[ Wed Sep 28 08:51:01 2022 ] 	Mean training loss: 0.0612. loss2: 0.0000. Mean training acc: 98.58%.
[ Wed Sep 28 08:51:01 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:51:01 2022 ] Eval epoch: 108
[ Wed Sep 28 08:51:30 2022 ] 	Mean test loss of 258 batches: 0.347466539021206.
[ Wed Sep 28 08:51:30 2022 ] 	Top1: 90.31%
[ Wed Sep 28 08:51:30 2022 ] 	Top5: 98.45%
[ Wed Sep 28 08:51:30 2022 ] Training epoch: 109
[ Wed Sep 28 08:54:38 2022 ] 	Mean training loss: 0.0604. loss2: 0.0000. Mean training acc: 98.62%.
[ Wed Sep 28 08:54:38 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:54:38 2022 ] Eval epoch: 109
[ Wed Sep 28 08:55:07 2022 ] 	Mean test loss of 258 batches: 0.34761097256902806.
[ Wed Sep 28 08:55:07 2022 ] 	Top1: 90.29%
[ Wed Sep 28 08:55:07 2022 ] 	Top5: 98.39%
[ Wed Sep 28 08:55:07 2022 ] Training epoch: 110
[ Wed Sep 28 08:58:15 2022 ] 	Mean training loss: 0.0568. loss2: 0.0000. Mean training acc: 98.69%.
[ Wed Sep 28 08:58:15 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:58:15 2022 ] Eval epoch: 110
[ Wed Sep 28 08:58:44 2022 ] 	Mean test loss of 258 batches: 0.3475460009432809.
[ Wed Sep 28 08:58:44 2022 ] 	Top1: 90.23%
[ Wed Sep 28 08:58:44 2022 ] 	Top5: 98.49%
[ Wed Sep 28 08:59:14 2022 ] Best accuracy: 0.9036210347546552
[ Wed Sep 28 08:59:14 2022 ] Epoch number: 104
[ Wed Sep 28 08:59:14 2022 ] Model name: work_dir/ntu60/csub/fc_bone
[ Wed Sep 28 08:59:14 2022 ] Model total number of params: 2082097
[ Wed Sep 28 08:59:14 2022 ] Weight decay: 0.0004
[ Wed Sep 28 08:59:14 2022 ] Base LR: 0.1
[ Wed Sep 28 08:59:14 2022 ] Batch Size: 64
[ Wed Sep 28 08:59:14 2022 ] Test Batch Size: 64
[ Wed Sep 28 08:59:14 2022 ] seed: 1
